


Unlabeled Principal Component Analysis

Neural Information Processing Systems

Using algebraic geometry, we establish that UPCA is a well-defined algebraic problem in the sense that the only matrices of minimal rank that agree with the given data are row-permutations of the ground-truth matrix, arising as the unique solutions of a polynomial system of equations.
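A minimal numpy sketch (illustrative only, not the paper's algorithm) of the rank invariance behind this uniqueness claim: permuting the rows of a matrix never changes its rank, so every row-permutation of the ground-truth matrix is an equally valid minimal-rank solution that agrees with the unlabeled data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a 6x4 ground-truth matrix of rank 2.
A = rng.standard_normal((6, 2)) @ rng.standard_normal((2, 4))

# Apply a random row permutation, as in the unlabeled setting.
P = np.eye(6)[rng.permutation(6)]
B = P @ A

# Both matrices have the same (minimal) rank.
assert np.linalg.matrix_rank(A) == np.linalg.matrix_rank(B) == 2
```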



Global Lyapunov functions: a long-standing open problem in mathematics, with symbolic transformers

Neural Information Processing Systems

Despite their spectacular progress, language models still struggle on complex reasoning tasks, such as advanced mathematics. We consider a long-standing open problem in mathematics: discovering a Lyapunov function that ensures the global stability of a dynamical system. This problem has no known general solution, and algorithmic solvers only exist for some small polynomial systems. We propose a new method for generating synthetic training samples from random solutions, and show that sequence-to-sequence transformers trained on such datasets perform better than algorithmic solvers and humans on polynomial systems, and can discover new Lyapunov functions for non-polynomial systems.
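For readers unfamiliar with the task: verifying a candidate Lyapunov function is the easy direction (discovery is the hard one). A hedged sympy sketch on a toy polynomial system chosen for this illustration, not taken from the paper:

```python
import sympy as sp

x1, x2 = sp.symbols('x1 x2', real=True)

# Toy polynomial system (illustrative): x1' = -x1**3, x2' = -x2**3.
f = [-x1**3, -x2**3]

# Candidate Lyapunov function and its derivative along trajectories.
V = x1**2 + x2**2
Vdot = sp.expand(sp.diff(V, x1) * f[0] + sp.diff(V, x2) * f[1])

# Vdot = -2*x1**4 - 2*x2**4 <= 0 everywhere, and V is radially
# unbounded, so V certifies global stability of the origin here.
assert Vdot == -2*x1**4 - 2*x2**4
```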





Learning to compute Gröbner bases

Neural Information Processing Systems

In this study, we investigate Gröbner basis computation from a learning perspective, envisioning it as a practical compromise for large-scale polynomial system solving in regimes where exact mathematical algorithms are computationally intractable.
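For context on what such a model is asked to produce: a Gröbner basis (in lexicographic order) triangularizes a polynomial system, much like Gaussian elimination does for linear systems. A small sympy example on a toy system (not from the paper):

```python
from sympy import groebner, symbols

x, y = symbols('x y')

# Toy polynomial system: a circle intersected with a line.
polys = [x**2 + y**2 - 1, x - y]

# A lexicographic Groebner basis triangularizes the system,
# exposing a univariate equation in y that can be solved first.
G = groebner(polys, x, y, order='lex')

# Every generator of the original ideal reduces to zero modulo G.
assert all(G.contains(p) for p in polys)
```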


Solving ill-conditioned polynomial equations using score-based priors with application to multi-target detection

Beinhorn, Rafi, Kreymer, Shay, Balanov, Amnon, Cohen, Michael, Zabatani, Alon, Bendory, Tamir

arXiv.org Machine Learning

Recovering signals from low-order moments is a fundamental yet notoriously difficult task in inverse problems. This recovery process often reduces to solving ill-conditioned systems of polynomial equations. In this work, we propose a new framework that integrates score-based diffusion priors with moment-based estimators to regularize and solve these nonlinear inverse problems. This introduces a new role for generative models: stabilizing polynomial recovery from noisy statistical features. As a concrete application, we study the multi-target detection (MTD) model in the high-noise regime. We demonstrate two main results: (i) diffusion priors substantially improve recovery from third-order moments, and (ii) they make the super-resolution MTD problem, otherwise ill-posed, feasible. Numerical experiments on MNIST data confirm consistent gains in reconstruction accuracy across SNR levels. Our results suggest a promising new direction for combining generative priors with nonlinear polynomial inverse problems.
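As an illustrative toy (not the MTD model itself) of how moment-based recovery reduces to a polynomial system: recovering two unknown values from their first two power moments via Newton's identities, which convert the moments into polynomial coefficients:

```python
import numpy as np

# Unknown signal values and their observed low-order moments.
a, b = 1.0, 3.0
m1 = a + b            # first moment
m2 = a**2 + b**2      # second moment

# Newton's identities give the elementary symmetric polynomials,
# i.e. the coefficients of t**2 - e1*t + e2 = 0.
e1 = m1
e2 = (m1**2 - m2) / 2.0

# The unknowns are the roots of this polynomial system.
roots = np.sort(np.roots([1.0, -e1, e2]))
assert np.allclose(roots, [1.0, 3.0])
```

In practice the moments are noisy and the resulting systems are ill-conditioned, which is where the paper's score-based diffusion priors come in.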